
[DOCS] Synchs and links hyperparameter descriptions #55827

Merged
lcawl merged 3 commits into elastic:master from lcawl:hyperparameters on May 4, 2020

Conversation

@lcawl (Contributor) commented Apr 27, 2020

This PR depends on elastic/stack-docs#990.
It adds links to the new hyperparameter optimization concept.
It also synchs the descriptions of hyperparameters between the create DFA jobs API and the get DFA job stats API.
Lastly, it adds a sentence to the descriptions of the parameters affected by hyperparameter optimization ("By default, this value is calculated during hyperparameter optimization") so that it's clear which ones are optimized.

Preview:
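
The hyperparameters in question are the optional ones on the regression and classification analyses of the data frame analytics APIs. As a minimal sketch (assuming the 7.x Python client and placeholder index, job, and field names), the snippet below sets one hyperparameter, `eta`, explicitly and leaves the rest to be calculated during hyperparameter optimization; the get DFA job stats API then reports the values that were actually used:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Sketch only: index, job id, and field names are placeholders.
# `eta` is set explicitly; hyperparameters that are omitted are,
# by default, calculated during hyperparameter optimization.
es.ml.put_data_frame_analytics(
    id="house-prices-regression",
    body={
        "source": {"index": "houses"},
        "dest": {"index": "houses-predictions"},
        "analysis": {
            "regression": {
                "dependent_variable": "price",
                "eta": 0.05,
            }
        },
    },
)

# The get DFA job stats API reports the hyperparameter values that were
# actually used, including those chosen by the optimization process.
print(es.ml.get_data_frame_analytics_stats(id="house-prices-regression"))
```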

@lcawl lcawl added the >docs (General docs changes), WIP, :ml (Machine learning), v8.0.0, v7.8.0, and v7.7.1 labels on Apr 27, 2020
@elasticmachine (Collaborator): Pinging @elastic/es-docs (>docs)

@elasticmachine (Collaborator): Pinging @elastic/ml-core (:ml)

@lcawl lcawl force-pushed the hyperparameters branch from d9fe6bd to be479ff on April 30, 2020, 18:56
@lcawl lcawl removed the WIP label Apr 30, 2020
@lcawl lcawl requested a review from szabosteve April 30, 2020 21:03
@lcawl lcawl marked this pull request as ready for review April 30, 2020 21:03
@szabosteve (Contributor) left a comment

Thank you for these changes, LGTM!

@lcawl lcawl merged commit 52a2f76 into elastic:master May 4, 2020
@lcawl lcawl deleted the hyperparameters branch May 4, 2020 14:37
@lcawl lcawl added v7.7.0 and removed v7.7.1 labels May 4, 2020

Labels

>docs (General docs changes), :ml (Machine learning), v7.7.0, v7.8.0, v8.0.0-alpha1

4 participants